Multilinear Compressive Learning

Authors

Abstract

Compressive learning (CL) is an emerging topic that combines signal acquisition via compressive sensing (CS) and machine learning to perform inference tasks directly on a small number of measurements. Many data modalities naturally have a multidimensional or tensorial format, with each dimension or tensor mode representing different features, such as the spatial and temporal information in video sequences or the spatial and spectral information in hyperspectral images. However, in existing CL frameworks, the CS component utilizes either random or learned linear projection on the vectorized signal to perform acquisition, thus discarding the multidimensional structure of the signals. In this article, we propose multilinear compressive learning (MCL), a framework that takes into account the tensorial nature of multidimensional signals in the acquisition step and builds the subsequent inference model on the structurally sensed measurements. Our theoretical complexity analysis shows that the proposed framework is more efficient than its vector-based counterpart in both memory and computation requirements. With extensive experiments, we also empirically show that our MCL framework outperforms the vector-based framework in object classification and face recognition tasks, and scales favorably when the dimensionalities of the original signals increase, making it highly suitable for high-dimensional signals.
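The memory advantage the abstract claims comes from replacing one dense sensing matrix on the vectorized signal with one small projection per tensor mode. The NumPy sketch below illustrates the idea under illustrative assumptions (the tensor shapes, projection sizes, and variable names are hypothetical, not the authors' implementation):

```python
import numpy as np

# Hypothetical sketch: multilinear vs. vectorized compressive sensing.
# Shapes and names are illustrative assumptions only.

rng = np.random.default_rng(0)

# A small "image" tensor: height x width x channels.
X = rng.standard_normal((32, 32, 3))

# Vectorized CS: one dense projection on the flattened signal.
m = 4 * 4 * 2                            # number of measurements
Phi = rng.standard_normal((m, X.size))   # 32 x 3072 sensing matrix
y_vec = Phi @ X.ravel()                  # measurement vector, shape (32,)

# Multilinear CS: one small projection per tensor mode (separable sensing).
P1 = rng.standard_normal((4, 32))        # mode-1 (height) projection
P2 = rng.standard_normal((4, 32))        # mode-2 (width) projection
P3 = rng.standard_normal((2, 3))         # mode-3 (channel) projection
Y = np.einsum('ia,jb,kc,abc->ijk', P1, P2, P3, X)  # 4 x 4 x 2 measurement tensor

# Same measurement count, far fewer sensing parameters.
print(Phi.size)                          # 98304
print(P1.size + P2.size + P3.size)       # 262
```

Both schemes produce 32 measurements, but the separable scheme stores 262 projection parameters instead of 98,304, and the measurement tensor `Y` retains the mode structure that the inference model can then exploit.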


Related articles

Online Multilinear Dictionary Learning for Sequential Compressive Sensing

A method for online tensor dictionary learning is proposed. With the assumption of separable dictionaries, tensor contraction is used to diminish an N-way model of O(...


Multilinear Multitask Learning

Many real world datasets occur or can be arranged into multi-modal structures. With such datasets, the tasks to be learnt can be referenced by multiple indices. Current multitask learning frameworks are not designed to account for the preservation of this information. We propose the use of multilinear algebra as a natural way to model such a set of related tasks. We present two learning methods...


From Compressive Clustering to Compressive Learning

It is often useful to fit a probability model to a data collection, in order to concisely represent the data, to feed learning algorithms that work on densities, to extract features or, simply, to uncover underlying structures. A particularly popular probability model is the Gaussian Mixture Model (GMM). Among many other applications, GMM form a central tool to build time-frequency models of au...


Compressive Feature Learning

This paper addresses the problem of unsupervised feature learning for text data. Our method is grounded in the principle of minimum description length and uses a dictionary-based compression scheme to extract a succinct feature set. Specifically, our method finds a set of word k-grams that minimizes the cost of reconstructing the text losslessly. We formulate document compression as a binary op...


Learning Dynamic Compressive Sensing Models

Random sampling in compressive sensing (CS) enables the compression of large amounts of input signals in an efficient manner, which is useful for many applications. CS reconstructs the compressed signals exactly with overwhelming probability when incoming data can be sparsely represented with a fixed number of components, which is one of the drawbacks of CS frameworks because the signal sparsit...



Journal

Journal title: IEEE Transactions on Neural Networks and Learning Systems

Year: 2021

ISSN: 2162-237X, 2162-2388

DOI: https://doi.org/10.1109/tnnls.2020.2984831